
Home/ TOK@ISPrague/ Group items tagged confirmation bias


Lawrence Hrubes

How a Gay-Marriage Study Went Wrong - The New Yorker - 1 views

  • Last December, Science published a provocative paper about political persuasion. Persuasion is famously difficult: study after study—not to mention much of world history—has shown that, when it comes to controversial subjects, people rarely change their minds, especially if those subjects are important to them. You may think that you’ve made a convincing argument about gun control, but your crabby uncle isn’t likely to switch sides in the debate. Beliefs are sticky, and hardly any approach, no matter how logical it may be, can change that. The Science study, “When contact changes minds: An experiment on transmission of support for gay equality,” seemed to offer a method that could work.
  • In the document, “Irregularities in LaCour (2014),” Broockman, along with a fellow graduate student, Joshua Kalla, and a professor at Yale, Peter Aronow, argued that the survey data in the study showed multiple statistical irregularities and was likely “not collected as described.”
  • If, in the end, the data do turn out to be fraudulent, does that say anything about social science as a whole? On some level, the case would be a statistical fluke. Despite what news headlines would have you believe, outright fraud is incredibly rare; almost no one commits it, and almost no one experiences it firsthand. As a result, innocence is presumed, and the mindset is one of trust.
  • ...2 more annotations...
  • There’s another issue at play: the nature of belief. As I’ve written before, we are far quicker to believe things that mesh with our view of how life should be. Green is a firm supporter of gay marriage, and that may have made him especially pleased about the study. (Did it have a similar effect on liberally minded reviewers at Science? We know that studies confirming liberal thinking sometimes get a pass where ones challenging those ideas might get killed in review; the same effect may have made journalists more excited about covering the results.)
  • In short, confirmation bias—which is especially powerful when we think about social issues—may have made the study’s shakiness easier to overlook.
markfrankel18

The false vaccine debate shows we're in a golden age of believing whatever we want - Qu... - 0 views

  • Today, the temptation to engage in confirmation bias is even greater since information confirming your particular bias is only a click away. Put another way: if you’re in the business of cherry picking data, this is a golden age, for the cherries are more plentiful than ever, and far easier to pick.
  • social media “may intensify not only feelings of loneliness, but also ideological isolation.”
  • Psychologist Jonathan Haidt, in his book The Righteous Mind, says our minds act less like scientists, boldly going where the facts lead, and more like press secretaries, stubbornly defending our core positions no matter what.
Lawrence Hrubes

BBC - Future - How to get people to overcome their bias - 0 views

  • How do you persuade somebody of the facts? Asking them to be fair, impartial and unbiased is not enough. To explain why, psychologist Tom Stafford analyses a classic scientific study.
markfrankel18

Is the Field of Psychology Biased Against Conservatives? - 0 views

  • Perhaps even more potentially problematic than negative personal experience is the possibility that bias may influence research quality: its design, execution, evaluation, and interpretation. In 1975, Stephen Abramowitz and his colleagues sent a fake manuscript to eight hundred reviewers from the American Psychological Association—four hundred more liberal ones (fellows of the Society for the Psychological Study of Social Issues and editors of the Journal of Social Issues) and four hundred less liberal (social and personality psychologists who didn’t fit either of the other criteria). The paper detailed the psychological well-being of student protesters who had occupied a college administration building and compared them to their non-activist classmates. In one version, the study found that the protesters were more psychologically healthy. In another, it was the more passive group that emerged as mentally healthier. The rest of the paper was identical. And yet, the two papers were not evaluated identically. A strong favorable reaction was three times more likely when the paper echoed one’s political beliefs—that is, when the more liberal reviewers read the version that portrayed the protesters as healthier.
  • All these studies and analyses are classic examples of confirmation bias: when it comes to questions of subjective belief, we more easily believe the things that mesh with our general world view. When something clashes with our vision of how things should be, we look immediately for the flaws.
Lawrence Hrubes

The Bitter Fight Over the Benefits of Bilingualism - The Atlantic - 0 views

  • It’s an intuitive claim, but also a profound one. It asserts that the benefits of bilingualism extend well beyond the realm of language, and into skills that we use in every aspect of our lives. This view is now widespread, heralded by a large community of scientists, promoted in books and magazines, and pushed by advocacy organizations.
  • But a growing number of psychologists say that this mountain of evidence is actually a house of cards, built upon flimsy foundations.
  • Jon Andoni Duñabeitia, a cognitive neuroscientist at the Basque Center on Cognition, Brain, and Language, was one of them. In two large studies, involving 360 and 504 children respectively, he found no evidence that Basque kids, raised on Basque and Spanish at home and at school, had better mental control than monolingual Spanish children.
  • ...1 more annotation...
  • Similar controversies have popped up throughout psychology, fueling talk of a “reproducibility crisis” in which scientists struggle to duplicate classic textbook results. In many of these cases, classic psychological phenomena that seem to be backed by years of supportive evidence, suddenly become fleeting and phantasmal. The causes are manifold. Journals are more likely to accept positive, attention-grabbing papers than negative, contradictory ones, which pushes scientists towards running small studies or tweaking experiments on the fly—practices that lead to flashy, publishable discoveries that may not actually be true.
markfrankel18

Climate buffoons' real motives: 5 reasons they still spout debunked garbage - Salon.com - 1 views

  • The most simplistic of climate deniers are those who looked out their windows this winter, saw that it was snowing, and reasoned that global warming therefore can’t be real. This speaks to a basic confusion of the difference between weather and climate. (If you’d like a much more thorough debunking of weather-based climate change denial, read this.) It’s also a classic example of confirmation bias: Deniers get giddy when it snows because it appears to confirm their belief that Earth isn’t really getting warmer. To understand why that doesn’t make sense, one need only look at the average global temperatures. Yes, it was very cold in parts of the U.S., but zoom out and it becomes clear that last month, overall, was the fourth-warmest January in recorded history. In some cases, it could be a fear of science that is driving this type of thinking.
  • A misunderstanding of what scientists take as “proof” may also be responsible for this confusion.
Lawrence Hrubes

Why Facts Don't Change Our Minds - The New Yorker - 0 views

  • Surveys on many other issues have yielded similarly dismaying results. “As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem. If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.
markfrankel18

Gamblers, Scientists and the Mysterious Hot Hand - The New York Times - 0 views

  • The opposite of that is the hot-hand fallacy — the belief that winning streaks, whether in basketball or coin tossing, have a tendency to continue, as if propelled by their own momentum. Both misconceptions are reflections of the brain’s wired-in rejection of the power that randomness holds over our lives. Look deep enough, we instinctively believe, and we may uncover a hidden order.
  • A working paper published this summer has caused a stir by proposing that a classic body of research disproving the existence of the hot hand in basketball is flawed by a subtle misperception about randomness. If the analysis is correct, the possibility remains that the hot hand is real.
  • Taken to extremes, seeing connections that don’t exist can be a symptom of a psychiatric condition called apophenia. In less pathological forms, the brain’s hunger for pattern gives rise to superstitions (astrology, numerology) and is a driving factor in what has been called a replication crisis in science — a growing number of papers that cannot be confirmed by other laboratories.
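
The “subtle misperception about randomness” the working paper identifies can be seen in a short simulation. (This is a minimal illustrative sketch, not the paper’s own analysis: within each short sequence of fair coin flips, we compute the proportion of heads that immediately follow a head, then average those per-sequence proportions. Intuition says this average should be 0.5, but selecting flips that follow a head inside finite sequences biases it below 0.5, which is why the classic hot-hand studies could find “no streakiness” even in genuinely random data.)

```python
import random

def prop_heads_after_heads(n_flips=4, trials=100_000, seed=42):
    """Average per-sequence proportion of heads that follow a head.

    Naive intuition says a fair coin should give 0.5 here, but
    averaging the proportion within each short sequence pulls the
    result below 0.5 -- the selection bias behind the hot-hand
    reanalysis described above.
    """
    rng = random.Random(seed)
    props = []
    for _ in range(trials):
        flips = [rng.random() < 0.5 for _ in range(n_flips)]
        # Collect the outcomes that immediately follow a head.
        after_heads = [flips[i + 1] for i in range(n_flips - 1) if flips[i]]
        if after_heads:  # skip sequences with no head in the first n-1 flips
            props.append(sum(after_heads) / len(after_heads))
    return sum(props) / len(props)

# For length-4 sequences the average lands noticeably below 0.5,
# even though every flip is independent and fair.
print(prop_heads_after_heads())
```

The bias shrinks as sequences get longer, which is why it went unnoticed for decades: a long shooting record looks “safe,” but the classic studies effectively averaged over many short conditional samples.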
markfrankel18

The Science of Why We Don't Believe Science | Mother Jones - 0 views

  • "A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point."
  • The theory of motivated reasoning builds on a key insight of modern neuroscience (PDF): Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself. We're not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.
  • In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers.
  • ...2 more annotations...
  • A key question—and one that's difficult to answer—is how "irrational" all this is. On the one hand, it doesn't make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information.
  • Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or "narrowcast" and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan's Arthur Lupia, are "not well-adapted to our information age."